Attractor Metadynamics in Adapting Neural Networks

Authors

  • Claudius Gros
  • Mathias Linkerhand
  • Valentin Walther
Abstract

Slow adaptation processes, like synaptic and intrinsic plasticity, abound in the brain and shape the landscape for the neural dynamics occurring on substantially faster timescales. At any given time the network is characterized by a set of internal parameters, which adapt continuously, albeit slowly. This set of parameters defines the number and the location of the respective adiabatic attractors. The slow evolution of the network parameters hence induces an evolving attractor landscape, a process which we term attractor metadynamics. We study the nature of the metadynamics of the attractor landscape for several continuous-time autonomous model networks. We find both first- and second-order changes in the location of adiabatic attractors and argue that the study of the continuously evolving attractor landscape constitutes a powerful tool for understanding the overall development of the neural dynamics.
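
The adiabatic picture sketched in the abstract can be illustrated with a minimal toy system (our own hedged sketch, not one of the paper's model networks):

```python
# Toy illustration of attractor metadynamics (a hedged sketch, not one
# of the paper's model networks). A fast variable x relaxes toward the
# adiabatic attractor of
#     dx/dt = a*x - x**3 + h,
# while the internal parameter a adapts slowly. For a < 0 the single
# attractor lies near x = h/|a|; as a crosses zero the attractor moves
# continuously (second order) onto the branch x* ~ sqrt(a). The small
# bias h breaks the symmetry so that x tracks the upper branch.

def simulate(t_max=200.0, dt=0.01, eps=0.01, h=0.01):
    x, a = 0.1, -1.0              # fast variable, slow parameter
    trace = []
    for k in range(int(t_max / dt)):
        x += dt * (a * x - x**3 + h)   # fast neural dynamics
        a += dt * eps                  # slow parameter adaptation
        trace.append((k * dt, a, x))
    return trace

trace = simulate()
_, a_end, x_end = trace[-1]   # a_end = 1.0; x_end tracks x* ~ sqrt(a)
```

Plotting x against a would show the attractor location branching continuously at a = 0, the second-order scenario mentioned in the abstract; a saddle-node bifurcation would instead produce a first-order (discontinuous) jump.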

Similar articles

Attractor metadynamics in a recurrent neural network: adiabatic vs. symmetry protected flow

In dynamical systems with distinct time scales, the time evolution in phase space may be influenced strongly by slow manifolds. Orbits then typically follow the slow manifold, which hence acts as a transient attractor, performing in addition rapid transitions between distinct branches of the slow manifold on the time scales of the fast variables. These intermittent transitions correspond to state...
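
A standard textbook example of such fast-slow flow (a generic sketch, not the recurrent network studied in the article above) is the van der Pol oscillator in Liénard form:

```python
# Fast-slow dynamics hugging a slow manifold (a generic sketch using
# the van der Pol oscillator in Lienard form, not the recurrent
# network of the paper):
#     eps * dx/dt = y - (x**3/3 - x)   # fast variable
#           dy/dt = -x                 # slow variable
# The orbit follows one branch of the slow manifold y = x**3/3 - x and
# jumps rapidly to the other branch at the fold points x = +/-1.

def run(eps=0.05, dt=0.001, t_max=20.0):
    x, y = 2.0, 0.0
    xs = []
    for _ in range(int(t_max / dt)):
        x += dt * (y - (x**3 / 3.0 - x)) / eps
        y += dt * (-x)
        xs.append(x)
    return xs

xs = run()
jumps = sum(1 for a, b in zip(xs, xs[1:]) if a * b < 0)  # branch switches
```

Between sign changes the orbit drifts slowly along one branch of the cubic nullcline (the transient attractor); each sign change of x is one of the rapid transitions described above.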

Generating functionals for autonomous latching dynamics in attractor relict networks

Coupling local, slowly adapting variables to an attractor network makes it possible to destabilize all attractors, turning them into attractor ruins. The resulting attractor relict network may show ongoing autonomous latching dynamics. We propose to use two generating functionals for the construction of attractor relict networks: a Hopfield energy functional generating a neural attractor network and a fun...
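
The attractor-ruin mechanism can be demonstrated with a toy version (our own hedged sketch, not the generating-functional construction proposed in the paper): a Hopfield-style network storing one pattern, with a slow fatigue variable that destabilizes whichever attractor is currently occupied, latches between the pattern and its mirror image.

```python
import math

# Toy attractor-ruin dynamics (a hedged sketch, not the paper's
# generating-functional construction). A Hebbian weight matrix stores
# one pattern xi; a slow fatigue variable phi_i tracks the activity
# x_i and is subtracted from the input, destabilizing the currently
# occupied attractor. Activity then latches between xi and -xi.

N = 8
xi = [1, 1, 1, 1, -1, -1, -1, -1]
J = [[xi[i] * xi[j] / N if i != j else 0.0 for j in range(N)]
     for i in range(N)]

x = [0.9 * s for s in xi]     # start near the stored pattern
phi = [0.0] * N               # slow fatigue variables
beta, alpha, eps, dt = 4.0, 1.2, 0.3, 0.1

overlaps = []                 # overlap of x with the stored pattern
for _ in range(2000):
    h = [sum(J[i][j] * x[j] for j in range(N)) - alpha * phi[i]
         for i in range(N)]
    x = [x[i] + dt * (-x[i] + math.tanh(beta * h[i])) for i in range(N)]
    phi = [phi[i] + dt * eps * (x[i] - phi[i]) for i in range(N)]
    overlaps.append(sum(x[i] * xi[i] for i in range(N)) / N)
```

The recorded overlap oscillates between roughly +1 and -1: each attractor is visited transiently and then abandoned once the fatigue variables catch up, which is the latching behaviour described in the abstract.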

Improving robust pattern recognition in attractor recurrent neural networks by employing chaos-like dynamics

In this paper, two kinds of chaotic neural networks are proposed to evaluate the efficiency of chaotic dynamics in robust pattern recognition. The first model is designed based on natural selection theory. In this model, the attractor recurrent neural network intelligently guides the evaluation of chaotic nodes in order to obtain the best solution. In the second model, a different structure of ch...

Learning in sparse attractor networks with inhibition

Attractor networks are important models for brain functions on a behavioral and physiological level, but learning of sparse patterns has not been fully explained. Here we show that including the activity-dependent effect of an inhibitory pool in Hebbian learning can accomplish learning of stable sparse attractors in both continuous attractor and point attractor neural networks.
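
One way to make this concrete (a hedged sketch under our own assumptions, not the model of the paper) is the covariance-style Hebbian rule that subtracting the mean activity, as an activity-dependent inhibitory pool effectively does, yields for sparse binary patterns:

```python
# Sketch of sparse-pattern Hebbian learning (our own toy version, not
# the paper's model). Subtracting the mean activity a from the pre-
# and postsynaptic terms -- the net effect an activity-dependent
# inhibitory pool can have -- gives a covariance-style rule under
# which sparse binary patterns become stable point attractors.

N = 60
patterns = [
    [1 if i < 12 else 0 for i in range(N)],          # pattern 1
    [1 if 12 <= i < 24 else 0 for i in range(N)],    # pattern 2
]
a = 12 / N  # mean activity (sparsity) of each pattern

W = [[sum((p[i] - a) * (p[j] - a) for p in patterns) if i != j else 0.0
      for j in range(N)] for i in range(N)]

def recall(cue, steps=3):
    """Synchronous threshold updates until (here, for) a few steps."""
    s = list(cue)
    for _ in range(steps):
        h = [sum(W[i][j] * s[j] for j in range(N)) for i in range(N)]
        s = [1 if h[i] > 0 else 0 for i in range(N)]
    return s

cue = list(patterns[0])
cue[0], cue[30] = 0, 1        # corrupt two bits of pattern 1
restored = recall(cue)        # converges back to pattern 1
```

With the plain Hebbian rule (no subtracted mean) the same sparse cue tends to collapse toward the all-active or all-silent state; the inhibitory correction is what stabilizes the sparse attractor.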

Attractor Density Models with Application to Analyzing the Stability of Biological Neural Networks

An attractor modeling algorithm is introduced which draws upon techniques found in nonlinear dynamics and pattern recognition. The technique is motivated by the need for quantitative measures that are able to assess the stability of biological neural networks which utilize nonlinear dynamics to process information.


Publication date: 2014